Individually Conditional Individual Mutual Information Bound on Generalization Error


Abstract

We propose an information-theoretic bound on the generalization error based on a combination of the error decomposition technique of Bu et al. and the conditional mutual information (CMI) construction of Steinke and Zakynthinou. In a previous work, Haghifam et al. proposed a different combination of the two aforementioned techniques, which we refer to as the conditional individual mutual information (CIMI) bound. However, in a simple Gaussian setting, both the CMI and the CIMI bounds are order-wise worse than that given by Bu et al. This observation motivated us to propose a new bound, which overcomes this issue by reducing the conditioning terms in the conditional mutual information. In the process of establishing this bound, a decoupling lemma is established, which also leads to a meaningful dichotomy and comparison among these bounds. As an application, we analyze the noisy and iterative algorithm stochastic gradient Langevin dynamics and provide an upper bound on its generalization error.
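For context, the individual-sample decomposition of Bu et al. that the abstract builds on, and the stochastic gradient Langevin dynamics (SGLD) update it is applied to, can be sketched as follows. This is an illustrative restatement under a standard subgaussian-loss assumption, not the paper's exact statement:

```latex
% Individual-sample mutual information bound (Bu et al.), assuming the loss
% is \sigma-subgaussian under the data distribution; W is the learned
% hypothesis and Z_i the i-th training sample:
\[
  \bigl|\mathbb{E}[\operatorname{gen}(W, S)]\bigr|
  \;\le\; \frac{1}{n} \sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)}.
\]

% SGLD update at step t, with step size \eta_t, inverse temperature \beta,
% empirical loss \widehat{L} on mini-batch B_t, and isotropic Gaussian noise:
\[
  W_{t+1} \;=\; W_t \;-\; \eta_t \,\nabla \widehat{L}(W_t, B_t)
  \;+\; \sqrt{\tfrac{2\eta_t}{\beta}}\;\xi_t,
  \qquad \xi_t \sim \mathcal{N}(0, I_d).
\]
```

The injected Gaussian noise is what makes the mutual information terms in such bounds finite and trackable across iterations.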


Related articles

Mutual Information and Conditional Mean Prediction Error

Mutual information is fundamentally important for measuring statistical dependence between variables and for quantifying information transfer by signaling and communication mechanisms. It can, however, be challenging to evaluate for physical models of such mechanisms and to estimate reliably from data. Furthermore, its relationship to better known statistical procedures is still poorly understo...


The Intrinsic Conditional Mutual Information

This paper is concerned with secret key agreement by public discussion: two parties Alice and Bob and an adversary Eve have access to independent realizations of random variables X, Y, and Z, respectively, with joint distribution P_XYZ. The secret key rate S(X; Y || Z) has been defined as the maximal rate at which Alice and Bob can generate a secret key by communication over an insecure, but au...


Directed Information and Conditional Mutual Information

—We study directed information in Bayesian networks and related structures. Mutual information is split into directed information and residual information. Some basic equations for directed information and residual information are determined.


The Intrinsic Conditional Mutual Information and Perfect

This paper is concerned with secret key agreement by public discussion: two parties Alice and Bob and an adversary Eve have access to independent realizations of random variables X, Y, and Z, respectively, with joint distribution P_XYZ. The secret key rate S(X; Y || Z) has been defined as the maximal rate at which Alice and Bob can generate a secret key by communication over an insecure, but au...



Journal

Journal title: IEEE Transactions on Information Theory

Year: 2022

ISSN: 0018-9448, 1557-9654

DOI: https://doi.org/10.1109/tit.2022.3144615